Radical Technologies

Metadata

  • Media: #Books 2021
    • Author: Adam Greenfield
    • Tags: #read #nonfiction #technology #hardcopy #dystopia

Tech: Rewritten

"A tremendously intelligent and stylish book" proclaims the endorsement from the Guardian on the front cover of Radical Technologies: The Design of Everyday Life. That more or less sums up the book. It's the guide to modern, everyday life and tech rewritten to remove the shiny sales pitch and hone in on the dark underbelly. It is indeed intelligently written and dissected but it still ultimately feels like a rewrite. If you keep up with the current state of the industry and technologies (or work in tech), there is not much new here.

The other thing to be aware of is that Radical Technologies is focused on the dark side of technology. I happen to agree with many of the concerns raised, but it would be interesting to see various perspectives and arguments compared, since having only the two polar perspectives of 1984 doom and climate disaster vs roses, sunshine, and hustle culture leaves a measured look at the whole picture wanting. The generally pessimistic perspective is warranted but overshadows what could have been a more thought-provoking exploration of the advantages and disadvantages and who makes those calls. Instead we are left with a straightforward take you don't really need the book for: tech gathers data on you (and everything) and squeezes the life out of society and the world (a picture so terrifying it can tip into apathy).

The book sticks to what it set out to do: present a pessimistic overview of a broad swath of tech we use every day. However, I couldn't help but wish there was more, or at least some mention, of sociological and psychological considerations as well. It's not a short book either, so the pessimism got a bit tiring after a while without a richer analysis or potential solutions to engage with.

What's covered

  • Smartphones
  • Internet of things
  • Augmented reality
  • Digital fabrication
  • Cryptocurrency
  • Blockchain beyond bitcoin (best chapter in the book imo and the only one that covered anything substantially new to me)
  • Automation* (odd one out. Felt a bit shoehorned so scary points could be made, but automation is not a specific technology so much as a movement or idea, which deserves a book of its own).
  • Machine learning
  • Artificial intelligence

Overall, the thoughtful perspective can help spark new connections or cement ideas in your head, but it was not what I had been expecting as someone already familiar with most of the technologies covered and their dangers. A good critique, but better if you are looking for an introductory guide and a stark wake-up call. If you already keep abreast of the developments and dark sides of technology, you might not find much new.

Themes

  • [[!The dark side of technology]]
  • Surveillance capitalism
  • Data != knowledge
  • Dangers of a post-human obsession

Notes

Technology is now the mediator of the everyday.

"It is simultaneously the conduit through which our choices are delivered to us, the mirror by which we see ourselves reflected, and the lens that lets others see us on a level previously unimagined."

  • "It [technology] shapes out perceptions, conditions the choices available to us, and remakes our experience of space and time."
  • Even maps are not objective. "Our sense of the world is subtly conditioned by information that is presented to us for interested reasons, and yet does not disclose that interest."
  • If The medium is the message, how are these new mediums shaping society and our communication?
  • Greenfield also argues [[!Context lends or diminishes authority to content]], using examples of technologies such as AR that superimpose information directly on a subject. This lends an implicit claim to authority greater, perhaps, than that of other mediators. AR aims for transparency, but any augmentive layer inserts itself between the wearer and the world, literally and metaphorically.

[[!We change for technology]]

  • Greenfield argues the rapid speed of technological development and push towards algorithmic optimisation means the world is increasingly being designed to match machine rather than human sensibilities. Spaces and systems are being shaped less by our own needs and more by those of the systems which supposedly serve us.
  • [[!Are we designing for humans or machines]]? Makes me think about product decisions when feasibility and tech stacks can overrule human factor considerations.
  • And what we change for may not be worth it. [[!Technology is not static]] and Greenfield points out that the ramifications of the technology we adapt to are not necessarily thought through nor are the items necessarily well made (ex: rushing out unfinished MVPs, cheap IoT devices vulnerable to hacks).

The human sense of place

"Reality is the platform we all share."

  • Your phone is not a phone, it's an expensive portal to the network/systems contingent upon said network.
  • The digital is not a mirror of physical reality and vice versa, even when it claims to be.
  • Interested to explore more about [[!digital vs physical geography]], like the physical geographical spread of ideas vs digital spread (ex: media consumption and recommendations in person vs social media).
  • "...we're both here and somewhere else at the same time, joined to everything at once yet never fully anywhere at all."
  • This can extend not just to a sense of place but a sense of time. Greenfield suggests "AR effectively endows its users with the ability to see through time" and this could be extended to other platforms or even the photos on our phones.
  • Are we offloading memory and human acts/connections to technological systems?
  • Companies maintain their own databases and, in a way, their own version of the world
  • What we see on the screen is not a shared or consistent representation of the same, relatively stable reality.

There is always a tradeoff

"...where technology is concerned, nothing happens automatically, nothing happens for free, and if you're not very, very careful, you might just wind up achieving an outcome at the widest variance with any you intended."

  • We trade privacy for convenience. Sometimes unknowingly, sometimes voluntarily. This is already a concern but stands to become even pricklier with things like AR glasses (as someone being scanned, you did not opt into this trade) and massive data collection.
  • Greenfield argues [[!The price of connection is vulnerability]]. This applies to tech (networks open up to hacks) but also social media and RL relationships.
  • Technology can try to address this with greater transparency but there is always [[!The friction between privacy and transparency]] ([[!Blockchain]]'s public ledger an interesting example of this).
  • There is always a tradeoff. Are we aware of the terms?
  • Automation offers us some perceived performance enhancement in exchange for discretion and control over the situation.
  • Greenfield argues that relinquishing this control means you never know the reasons behind many of the things that happen or are decided for you (like why you were not approved for that loan). Could this lack of clear causal links be a factor in the seeming modern increase in anxiety?

"[[!The ideology of ease]]" (Bradley Dilger)

"But the main problem with the virtual assistant is that it fosters an approach to the world that is literally thoughtless, leaving users disinclined to sit out any particularly prolonged frustration of desire, and ever less critical about the processes that result in the satisfaction of their needs and wants."

  • Greenfield postulates that [[!Interactions that disappear from sight disappear from thought]]. UX could play a huge role here. Is there danger in the mantra of 'frictionless' design and prioritizing ease of use above all else?
  • He continues that thoughtless use makes us less critical about processes as long as they satisfy our needs and wants. Amazon is great at taking advantage of this. Ease is presented as a good thing for consumers but this withdrawal from awareness actually more often serves another's interests.
  • "We live in a world in which things withdraw from awareness, silently enabling our more explicit deeds." - Graham Harman
  • According to Greenfield, we end up engineering/designing away "the hassle of choice" and this could have far reaching consequences.

Is technology making us poorer?

  • Greenfield points out that poverty could be said to be not so much the lack of things but a dependency on others to furnish basic needs. Does this mean tech is making us all poor as we become increasingly reliant, even as it gives more and more?
  • Continuing from the perspective of [[!The ideology of ease]], you could argue it is making our choices poorer even as the choices seem to have improved.
  • What about cases where technology improves standards of living? Then again, the industry as a whole produces fewer jobs compared to its output than other industries which it 'disrupts'. Is it more of a benevolent overlords situation? An example of [[!Technology as colonization]]?

[[!Technology as colonization]]

  • Greenfield argues information processing has colonized everyday life. Worse, in becoming part of the everyday, the interests and ambitions of those involved are not disclosed and are instead presented as unbiased.
  • He goes further: tech's aspirations as applied to governance and cities imply that there is only one universally correct solution to each individual or collective need, and that it can be found algorithmically. That idea of applying your one 'true and right' solution to a problem regardless of context is reminiscent of physical [[!Colonization]].

Does technology disrupt or cement control?

  • Greenfield argues that often the technology that thrives is what dovetails with existing ideologies. Likewise machine learning, depending on the data fed in, can reinforce (or undermine) existing perspectives.
  • "What is that intelligence other than a distillation of the way we've chosen to order our societies in the past?"
  • Path dependence - the tendency of dynamic systems to evolve in ways determined by past decisions

Potential

"[[!The purpose of a system is what it does]]." - Stafford Beer

  • [[!Blockchain]] is built on ideas of freedom from the state, but it would also be a powerful tool of enforcement were it to be embraced by the state.
  • Greenfield comes from the viewpoint that there is no such thing as potential, "there are merely states of a system that have historically been enacted, and those that have not yet been enacted." The only way to see if a system can assume a given state is to try enacting it.
  • [[!Technology is not static]]. It is always evolving. Just because it was designed with good intentions does not mean it will always be used for such.

Data != Knowledge

"Perception itself is always already a process of editing and curation."

  • [[!Collected information does not equal knowledge]]. But so often we conflate data with knowledge.
  • Greenfield states data is never just data. There are always political, interested, or simply erroneous decisions that go into data collection, and pretending otherwise is to claim a scientific objectivity that isn't there.
  • Machine learning assumes data is neutral and objective but at every stage, human judgment is involved.
  • This becomes even more tricky when we try to quantify the intangible. Can you quantify inner peace?
  • Greenfield says "the claim that anything at all is perfectly knowable is perverse" and so what AI and big data aspires to is fundamentally flawed by assuming there is a single perfect solution that can be reached.

[[!Beliefs about the future impact the present]]

"Beliefs about the shape of the future can be invoked, leveraged, even weaponized, to drive change in the present."

  • "Belief, in other words, exerts a peculiar kind of gravitation, pulling history toward it..."
  • What matters is not whether the tools actually perform or the data actually means anything, but whether their users believe that they do and act upon that belief.

[[!Machine Learning]]

"machine learning is the process by way of which algorithms are taught to recognise patterns in the world, through the automated analysis of very large data sets."

  • Measure world --> Organise data --> Synthesise information + knowledge --> ? Apply magic wisdom.
  • Patterns in big data begin to suggest questions, rather than other way around
  • Not all algorithms are executed by software
  • The more unconsciously we do something, the harder to deconstruct and encode.
  • In essence, Greenfield argues [[!Machine Learning]]'s main goal is to teach computers to generalise.
  • Optimise for accuracy (the overall share of labels that are correct), precision (everything tagged X really is X), or recall (every known X gets identified). Which you optimise for can have radically different consequences. Do you prefer false positives or false negatives? (See the sketch after this list.)
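
A minimal sketch of these three measures (my own illustration, not from the book), using made-up labels, to show how the same predictions can score very differently depending on which measure you care about:

```python
# Minimal sketch (not from the book): how accuracy, precision, and recall
# can tell very different stories about the same classifier.

def metrics(actual, predicted, positive="X"):
    """Count the outcomes for one class and derive the three measures."""
    tp = sum(a == positive and p == positive for a, p in zip(actual, predicted))
    fp = sum(a != positive and p == positive for a, p in zip(actual, predicted))
    fn = sum(a == positive and p != positive for a, p in zip(actual, predicted))
    tn = sum(a != positive and p != positive for a, p in zip(actual, predicted))

    accuracy = (tp + tn) / len(actual)              # share of all labels that are correct
    precision = tp / (tp + fp) if tp + fp else 0.0  # of everything tagged X, how much really is X
    recall = tp / (tp + fn) if tp + fn else 0.0     # of all real X, how much was found
    return accuracy, precision, recall

# Ten items, only two of which are really "X". A lazy classifier that never
# says "X" still scores 80% accuracy while recall collapses to zero.
actual = ["X", "X", "Y", "Y", "Y", "Y", "Y", "Y", "Y", "Y"]
print(metrics(actual, ["Y"] * 10))  # (0.8, 0.0, 0.0)

# An eager classifier that tags everything "X" gets perfect recall but poor
# precision: lots of false positives.
print(metrics(actual, ["X"] * 10))  # (0.2, 0.2, 1.0)
```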

The politics of technology

"If you give me six lines written by the hand of the most honest of men, I will find something in them with which to hang him." - Richelieu

  • Law emphasises judgment after the fact, while tech emphasises prediction.
  • An algorithm optimised for recall and applied to policing presumes guilt, not innocence, which is illegal in the US: the law says it's better to miss some offenders than to have false positives. But that's the opposite of what is being done (see the sketch after this list).
  • The Chicago police keep a Heat List of people likely to be involved in violent crime, who they check up on. Like something straight out of Minority Report (and surely totally illegal?)
  • Suspicion falls on you not for anything you do, or even anything you might do, but because you occupy an area of interest. Is this the same thing you can see in other areas, like tracking on social media?
  • The algorithmic black box also obscures these systems from legal oversight. Who is accountable in these circumstances? The complications are not technical but legal and institutional, yet it is not legal or institutional actors who are handling them.
  • "...authorship of an algorithm to guide distribution of civic resources is inherently political, yet neither algorithms or their designers are democratically accountable".
  • Human laws executed with inhuman precision.
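
A hypothetical sketch of that tradeoff (my own, with entirely invented names and risk scores, not anything from the book): tuning a flagging threshold for recall catches every eventual offender, but only by sweeping up mostly innocent people.

```python
# Hypothetical illustration: the same risk scores, two different thresholds.
# Optimising for recall ("flag everyone risky") floods the list with people
# who would never have offended.

people = [  # (name, risk_score, actually_offends) -- entirely made-up data
    ("A", 0.95, True), ("B", 0.80, False), ("C", 0.75, True),
    ("D", 0.60, False), ("E", 0.55, False), ("F", 0.40, True),
    ("G", 0.30, False), ("H", 0.10, False),
]

def flag(threshold):
    """Flag everyone at or above the threshold and report the consequences."""
    flagged = [(name, offends) for name, score, offends in people if score >= threshold]
    false_positives = sum(not offends for _, offends in flagged)
    missed = sum(offends for _, score, offends in people if score < threshold)
    return len(flagged), false_positives, missed

# Cautious threshold: no false positives, but future offenders slip through.
print(flag(0.90))  # (1, 0, 2) -- 1 flagged, 0 wrongly, 2 missed
# Recall-first threshold: nobody slips through, but most of the flagged are innocent.
print(flag(0.30))  # (7, 4, 0)
```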

[[!When a measure becomes a target it ceases to become useful as a measure]] - Goodhart's Law

  • Another example of how [[!We change for technology]]: "Actors whose performance is subject to measurement may consciously adapt their behavior to produce metrics favorable to them in one way or another."
  • It does not matter whether the data actually means anything, only whether those being measured and those using the data believe it does.
  • Japanese company tracking and rating employees' smiles.

[[!Blockchain]] & Bitcoin

  • Bitcoin mining has become the domain of the super rich with dedicated mining rigs. Does this undermine the entire everyman principle of a distributed network? Currently two Chinese companies between them control 51% of the network's mining power (see the Bitcoin 51 percent attack) and could theoretically alter the ledger. A toy sketch of why that is follows this list.
  • Bitcoin - the future or a "decentralized waste heat creator"?
  • [[!Blockchain]] eliminates the need for intermediaries in transactions. Smart contracts (Ethereum) eliminate the need for an intermediary to enforce a promise.
  • Distributed Autonomous Organisations: Vitalik Buterin saw any group of people as consisting of a set of decisions.
  • Is Bitcoin a currency or a commodity?
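
A toy sketch of the hash-linked ledger idea (my own simplification, nothing like Bitcoin's real protocol): each block commits to the previous block's hash, so rewriting an old entry means re-mining every later block, which is why controlling a majority of mining power matters.

```python
# Toy hash-chained ledger with proof of work (illustrative only).
import hashlib

DIFFICULTY = "0000"  # a hash must start with this prefix to count as "mined"

def mine(prev_hash: str, data: str):
    """Search for a nonce so that sha256(prev_hash + data + nonce) meets the difficulty."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{prev_hash}{data}{nonce}".encode()).hexdigest()
        if digest.startswith(DIFFICULTY):
            return digest, nonce
        nonce += 1

# Build a short chain: each block's hash depends on the block before it.
chain = []
prev = "genesis"
for tx in ["alice pays bob 1", "bob pays carol 1", "carol pays dan 1"]:
    block_hash, nonce = mine(prev, tx)
    chain.append({"tx": tx, "prev": prev, "nonce": nonce, "hash": block_hash})
    prev = block_hash

# Tampering with the first transaction breaks the link to every later block.
# An attacker would have to re-mine the rest of the chain faster than everyone
# else combined, which is exactly what majority (51%) control would allow.
chain[0]["tx"] = "alice pays mallory 1000"
recomputed = hashlib.sha256(
    f"{chain[0]['prev']}{chain[0]['tx']}{chain[0]['nonce']}".encode()
).hexdigest()
print(recomputed == chain[1]["prev"])  # False: the chain no longer validates
```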

Complex systems are fragile

  • Everyday actions are now mediated by and contingent upon a wide range of obscure factors and systems.
  • We rely on access to the network to accomplish ordinary goals.
  • Tech infrastructure has a hierarchy of dependencies. Systems manage infrastructure which manage systems which... how long before it breaks?

[[!A distinction without a difference]]

  • Monopolies outsource R&D to the world by buying up companies. Startups then optimise for an MVP and selling up rather than for a viable business.
  • Greenfield argues monopolies/giant tech stacks like Amazon seek to be an irremovable intermediary, competing with each other for the same goal: "to mediate and monetize everyday life to the maximum possible extent."
  • "A distinction without a difference": interesting that they all become similar. What would actually be different in a world dominated by Google or Amazon rather than Apple, or vice versa? Does this lead to some of the apathy around them?

Automation & the push for post-humanism

  • "Why do so many of us seem to want to replace ourselves so very badly?"
  • What do people actually want from technology? The author argues a few (presumably companies and states) want the same thing they want from other people: cheap, reliable, docile labour. Others are in search of intangibles: meaning, order, or certainty.
  • Automation threatens jobs across spectrum.
  • What happens if there is no work? Can you experience pleasure without absence (variation, effort, or friction)?

Defies understanding

  • We generally use analogy to grapple with new concepts and ideas. Blockchain, Greenfield says, has no handy metaphors.
  • AI's otherness of thought. Aesthetic but can also be hard to grasp or deconstruct with human logic.

Misc

  • [[!Scarcity mindset]] "We've lived with scarcity for so long, have so long enshrined it at the very heart of our assumptions about value, choice, and necessity, that it's difficult to imagine the contours of a life unmarked by it."
  • East India Company as early crowdfunding (joint stock enterprises a technology to distribute risk)
  • All of these technologies are still relatively new, yet none are considered particularly remarkable or anything but normal now.
  • It's not any one technology, but how multiple technologies can work together.
  • "It is invariably the worst sort of blunder to imagine the future as a straight-line extrapolation of the present."
  • "If you're committed to a technology [...] then you are more or less compelled to find things for that technology to do, whether it works defensibly well in those roles or not."
  • In the desire to overcome/outdo the human, are we "discarding a gift we already have at hand and barely know what to do with"?

Topics to Pursue

  • Stolpersteine/'stumbling blocks' Holocaust memorial integrated in city by Gunter Demnig.
  • Stafford Beer, cyberneticist
  • Regimes of truth - Foucault
  • Smart Things: Ubiquitous Computing User Experience Design Mike Kuniavsky #bookList
  • Heidegger on Objects and Things Graham Harman #bookList
  • Ground Control: Fear and Happiness in the Twenty-first Century City Anna Minton #bookList
  • Laura Kurgan Forests of Data
  • IBM and the Holocaust Edwin Black #bookList
  • 507movements.com
  • Against Intellectual Monopoly
  • Bitcoin Mining Explained Like You're Five
  • Critical Algorithm Reading List
  • History of Neural Networks
  • The Black Box Society Frank Pasquale #bookList
  • Alex Tabarrok The Rise of Opaque Intelligence
  • The Next Rembrandt project
  • Billion Year Spree: The True History of Science Fiction Brian Aldiss

Scrapbook Concepts

  • The Quantified Self: "self-knowledge through numbers!" #factions
  • 51 Percent Attack: the bitcoin vulnerability where someone manages to get control of 51% of the network and alter the ledger #cyberpunk #adventures
  • Information brokerage #locations